
Property is Power! The New Redlining: How Algorithms Are Quietly Blocking Black Homeownership

March 1, 2026

Artificial intelligence and algorithmic systems now dominate mortgage lending decisions, presenting new forms of discrimination that mirror historical redlining practices but without explicit intent. These automated systems learn from historical data that reflects decades of systematic exclusion of Black Americans from homeownership opportunities, causing algorithms to perpetuate racial disparities through seemingly neutral factors like zip codes and credit histories. Black borrowers, even those who are well-qualified professionals, often face higher interest rates, stricter requirements, and reduced access to prime mortgage products due to these opaque automated decisions.

Who is affected

  • Black Americans broadly, particularly those historically denied access to credit and housing
  • College-educated Black professionals
  • Black first-time homebuyers
  • Black families seeking to build generational wealth through homeownership
  • Borrowers with gaps in credit histories due to historical exclusion

What action is being taken

  • The article describes no actions currently underway. It covers historical discrimination and current algorithmic lending practices, but proposes only future steps that "can and should be taken."

Why it matters

  • Algorithmic bias in lending perpetuates wealth inequality by imposing higher borrowing costs that reduce equity accumulation and delay homeownership for Black families, compounding racial wealth gaps over time. Unlike historical discrimination, these automated systems operate without transparency or accountability, making disparate outcomes difficult to identify and challenge. Homeownership represents the primary engine of wealth creation in the United States, so restricted access limits economic power, community stability, and intergenerational advancement for Black Americans. The issue is particularly significant because discrimination now occurs without intent through systems falsely perceived as objective and race-neutral.

What's next

  • The article proposes four recommended steps: requiring lenders to audit algorithmic outcomes for disparate impact and provide explanations for adverse decisions; responsibly expanding use of alternative data like rent and utility payment history; evolving consumer education to help borrowers understand data-driven lending environments; and modernizing fair-lending enforcement laws to address algorithmic systems. However, these are recommendations rather than explicitly stated next steps currently planned or underway.
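The first recommended step above, auditing algorithmic outcomes for disparate impact, is often operationalized with the "four-fifths rule" used in fair-lending and employment-discrimination analysis: a protected group's approval rate below 80% of the reference group's is a conventional red flag. The sketch below illustrates that calculation; the function names and sample data are hypothetical and are not drawn from the article.

```python
# Minimal sketch of a disparate-impact audit for lending decisions,
# using the conventional "four-fifths rule". All data here is
# hypothetical illustration, not figures from the article.

def approval_rate(decisions):
    """Fraction of applications approved (decisions are True/False)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected, reference):
    """Ratio of the protected group's approval rate to the reference
    group's. Values below 0.8 are a common red flag for disparate
    impact under the four-fifths rule."""
    return approval_rate(protected) / approval_rate(reference)

# Hypothetical audit sample: approval outcomes per applicant group.
black_applicants = [True, False, False, True, False,
                    False, True, False, False, False]   # 3/10 approved
white_applicants = [True, True, False, True, True,
                    False, True, True, False, True]      # 7/10 approved

air = adverse_impact_ratio(black_applicants, white_applicants)
print(f"Adverse impact ratio: {air:.2f}")
print("Flag for review" if air < 0.8 else "Within four-fifths threshold")
```

A real audit would control for legitimate underwriting variables before attributing the gap to the algorithm, but the ratio above is the usual first screen regulators and lenders apply.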

Read full article from source: Michigan Chronicle